Removing L2-norm in contrastive loss (L2-norm already present in CosSim) #6550
wyli merged 2 commits into Project-MONAI:dev
Conversation
…ne-similarity computation) Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Force-pushed from 0a1cc81 to 34e84a3
/build
Hi @Lucas-rbnt, when the L2 norms are large (e.g. for high-dimensional embeddings), do you think this might be less numerically stable? Have you tested this PR in end-to-end trainings?
Seems to be addressed in PyTorch 1.12 for the same reason: pytorch/pytorch@9e137ee
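One way to sanity-check the stability concern is to compare explicit L2-normalization followed by a dot product against the built-in call on large-norm embeddings. This is a minimal sketch, not code from the PR; the shapes and magnitudes are illustrative:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# High-dimensional embeddings with deliberately large norms.
x = torch.randn(4, 512) * 1e4
y = torch.randn(4, 512) * 1e4

# Manual route (what ContrastiveLoss did before the PR): normalize
# each row onto the unit hypersphere, then take the dot product.
manual = (F.normalize(x, dim=1) * F.normalize(y, dim=1)).sum(dim=1)

# Built-in route: cosine_similarity performs the normalization internally.
builtin = F.cosine_similarity(x, y, dim=1)

print(torch.allclose(manual, builtin, atol=1e-5))
```

On recent PyTorch versions the two routes agree to within floating-point tolerance even for large norms, which is consistent with the fix referenced above.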
wyli
left a comment
Thanks, it looks good to me. (Probably less stable for early versions of PyTorch, but more efficient for recent versions.)
Hi @wyli! I am currently running a SimCLR training on BraTS21 data, centered on the tumor, to compare results before and after the commit.
/build
Description
The `forward` method of the `ContrastiveLoss` performs L2-normalization before computing cosine similarity. The `torch.nn.functional.cosine_similarity` method already handles this pre-processing to make sure that `input` and `target` lie on the surface of the unit hypersphere. This step therefore adds unnecessary cost and can be removed.

Types of changes
Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
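The redundancy described in the PR can be illustrated with a short sketch. This is not MONAI's actual `ContrastiveLoss` implementation; the function and tensor names are illustrative:

```python
import torch
import torch.nn.functional as F

def pairwise_cosine(input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """All-pairs cosine similarity between rows of `input` and `target`."""
    # Before the PR, an explicit normalization preceded the call:
    #   input = F.normalize(input, dim=1)
    #   target = F.normalize(target, dim=1)
    # cosine_similarity already divides by the L2 norms of both inputs,
    # so the extra normalization above is redundant work.
    return F.cosine_similarity(input.unsqueeze(1), target.unsqueeze(0), dim=2)

emb_a = torch.randn(8, 128)
emb_b = torch.randn(8, 128)
sim = pairwise_cosine(emb_a, emb_b)
print(sim.shape)  # torch.Size([8, 8])
```

Dropping the explicit `F.normalize` calls leaves the output unchanged (the values are still cosine similarities in [-1, 1]) while saving one pass over each embedding tensor.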